    Linear Cholesky decomposition of covariance matrices in mixed models with correlated random effects

    Modelling the covariance matrix in linear mixed models offers an additional advantage for inference about subject-specific effects, particularly in the analysis of repeated measurement data, where the time-ordering of the responses induces significant correlation. Difficulties encountered in these modelling procedures include high dimensionality, statistical interpretability of the parameters, the positive definiteness constraint, and violation of model assumptions. One key assumption in linear mixed models is that the random errors and random effects are independent; its violation leads to biased and inefficient parameter estimates. To minimize these drawbacks, we developed a procedure that accounts for the correlations induced by violation of this key assumption. In recent literature, variants of the Cholesky decomposition have been employed to circumvent the positive definiteness constraint, with parsimony achieved by jointly modelling the mean and covariance parameters using covariates. In this article, we developed a linear Cholesky decomposition of the random effects covariance matrix, providing a framework for inference that accounts for the correlations induced by covariate(s) shared by both the fixed and random effects design matrices, a circumstance that leads to a lack of independence between random errors and random effects. The proposed decomposition is particularly useful for parameter estimation with the maximum likelihood and restricted/residual maximum likelihood procedures.
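
    The abstract's central technical point is that a Cholesky parameterization circumvents the positive definiteness constraint, because any covariance matrix assembled as L L^T from a lower-triangular factor L with a positive diagonal is positive definite by construction. The short Python sketch below illustrates only that general idea; it is not the paper's linear decomposition (the full text is not available here), and the parameter layout and function names are assumptions introduced for illustration.

        import numpy as np

        def cholesky_factor(theta, q):
            # Fill a q x q lower-triangular matrix from the unconstrained vector
            # theta (length q*(q+1)/2) and exponentiate the diagonal so every
            # pivot is strictly positive.
            L = np.zeros((q, q))
            L[np.tril_indices(q)] = theta
            L[np.diag_indices(q)] = np.exp(np.diag(L))
            return L

        def covariance_from_theta(theta, q):
            # Sigma = L @ L.T is positive definite for any real-valued theta,
            # which is the sense in which the constraint is circumvented.
            L = cholesky_factor(theta, q)
            return L @ L.T

        # Example: a 3x3 random-effects covariance matrix from 6 free parameters.
        q = 3
        rng = np.random.default_rng(0)
        theta = rng.normal(size=q * (q + 1) // 2)
        Sigma = covariance_from_theta(theta, q)
        print(np.linalg.eigvalsh(Sigma))  # all eigenvalues strictly positive

    In a maximum likelihood or REML fit, an optimizer would treat a vector like theta as the free parameters, so the search is unconstrained even though the implied covariance matrix is always valid.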